Sequential Testing for Sparse Recovery
Authors
Abstract
Similar resources
Interpretable Recurrent Neural Networks Using Sequential Sparse Recovery
Recurrent neural networks (RNNs) are powerful and effective for processing sequential data. However, RNNs are usually considered “black box” models whose internal structure and learned parameters are not interpretable. In this paper, we propose an interpretable RNN based on the sequential iterative soft-thresholding algorithm (SISTA) for solving the sequential sparse recovery problem, which mod...
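The excerpt names the sequential iterative soft-thresholding algorithm (SISTA) but does not reproduce it. Purely as orientation, here is a minimal sketch of the plain (non-sequential) ISTA building block for min_x ½‖y − Ax‖² + λ‖x‖₁; the step size, λ, and toy data are my assumptions, and the sequential variant and its RNN interpretation from the paper are not shown.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(y, A, lam=0.1, n_iter=500):
    """Plain ISTA for min_x 0.5*||y - A x||^2 + lam*||x||_1 (illustrative sketch only)."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, with L the Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                # gradient of the smooth data-fit term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy usage: recover a sparse vector from a few random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200); x_true[[3, 70, 150]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = ista(y, A, lam=0.05)
```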
Sparse Recovery
List of included articles [1] H. Rauhut. Random sampling of sparse trigonometric polynomials. Appl. Comput. [2] S. Kunis and H. Rauhut. Random sampling of sparse trigonometric polynomials II-orthogonal matching pursuit versus basis pursuit. [3] H. Rauhut. Stability results for random sampling of sparse trigonometric polynomials. [4] H. Rauhut. On the impossibility of uniform sparse reconstruct...
Adaptive Sampling for Sparse Recovery
Consider n data sequences, each consisting of independent and identically distributed elements drawn from one of the two possible zero-mean Gaussian distributions with variances A0 and A1. The problem of quickly identifying all of the sequences with variance A1 is considered and an adaptive two-stage experimental design and testing procedure is proposed. The agility and reliability gains in comp...
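The adaptive two-stage procedure itself is not described in the excerpt. As a rough illustration of the problem setup only, here is a hypothetical non-adaptive baseline that thresholds each sequence's empirical variance; the variances, threshold, and variable names are assumptions, not the paper's design.

```python
import numpy as np

def simulate_sequences(n, m, frac_anomalous=0.05, A0=1.0, A1=4.0, seed=0):
    """Draw n zero-mean Gaussian sequences of length m; each has variance A0 or, rarely, A1."""
    rng = np.random.default_rng(seed)
    labels = rng.random(n) < frac_anomalous            # True = sequence with variance A1
    scales = np.where(labels, np.sqrt(A1), np.sqrt(A0))
    data = rng.standard_normal((n, m)) * scales[:, None]
    return data, labels

def variance_threshold_test(data, A0=1.0, A1=4.0):
    """Non-adaptive baseline: flag a sequence when its empirical variance exceeds a midpoint threshold."""
    emp_var = np.mean(data ** 2, axis=1)               # zero-mean, so mean of squares estimates the variance
    threshold = 0.5 * (A0 + A1)                        # ad hoc threshold between the two variances
    return emp_var > threshold

data, labels = simulate_sequences(n=1000, m=40)
decisions = variance_threshold_test(data)
n_errors = int(np.sum(decisions != labels))
```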
Numerical methods for sparse recovery
These lecture notes are an introduction to methods recently developed for performing numerical optimizations with linear model constraints and additional sparsity conditions on the solutions, i.e. we expect solutions that can be represented as sparse vectors with respect to a prescribed basis. This type of problem has recently been greatly popularized by the development of the field of nonadapt...
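A standard instance of such a problem is basis pursuit, min ‖x‖₁ subject to Ax = b. The following minimal sketch is my own reformulation as a linear program via the split x = x⁺ − x⁻, solved with scipy.optimize.linprog; it is not taken from the lecture notes.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 s.t. A x = b by splitting x = xp - xn with xp, xn >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                        # objective: sum(xp) + sum(xn) = ||x||_1
    A_eq = np.hstack([A, -A])                 # equality constraint: A xp - A xn = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    xp, xn = res.x[:n], res.x[n:]
    return xp - xn

# Toy usage: an underdetermined system with a sparse solution.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 100))
x_true = np.zeros(100); x_true[[5, 40, 77]] = [2.0, -1.0, 0.5]
b = A @ x_true
x_hat = basis_pursuit(A, b)
```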
Sequential Sparse NMF
Nonnegative Matrix Factorization (NMF) is a standard tool for data analysis. An important variant is the Sparse NMF problem. A natural measure of sparsity is the L0 norm; however, its optimization is NP-hard. Here, we consider a sparsity measure linear in the ratio of the L1 and L2 norms, and propose an efficient algorithm to handle the norm constraints which arise when optimizing this measure. ...
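The excerpt refers to a sparsity measure based on the ratio of the L1 and L2 norms. The paper's constrained optimization is not reproduced here; this is only a small sketch of one common Hoyer-style normalization of that ratio, and the exact form is an assumption on my part.

```python
import numpy as np

def l1_l2_sparsity(v, eps=1e-12):
    """Hoyer-style sparsity in [0, 1], built from the L1/L2 norm ratio:
    ~1 for a 1-sparse vector, ~0 for a constant vector (normalization assumed)."""
    v = np.asarray(v, dtype=float)
    n = v.size
    ratio = np.linalg.norm(v, 1) / (np.linalg.norm(v, 2) + eps)
    return (np.sqrt(n) - ratio) / (np.sqrt(n) - 1)

print(l1_l2_sparsity([0, 0, 5, 0]))   # close to 1.0: maximally sparse
print(l1_l2_sparsity([1, 1, 1, 1]))   # close to 0.0: not sparse at all
```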
Journal
عنوان ژورنال: IEEE Transactions on Information Theory
سال: 2014
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/tit.2014.2363846